Table of Contents

1 Advantages of XGBoost

2 How XGBoost Works

3 XGBoost Parameters

XGBoost's parameters can be divided into three categories:

1.General Parameters

1.booster[default=gbtree]

Sets the booster type (gbtree, gblinear, dart). For classification problems, gbtree or gblinear can be used; for regression problems, any type can be used.

2.nthread[default=maximum cores available]

Enables parallel computation. There is usually no need to change this parameter: by default all available cores are used, which gives the fastest computation.

3.silent[default=0]

With the default of 0, the R console is flooded with runtime messages; setting it to 1 suppresses them. It is usually best to leave this parameter alone.
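As a quick sketch (the parameter values here are illustrative, not recommendations), the general parameters are simply passed alongside the training call; `dtrain` is assumed to be an `xgb.DMatrix` such as the one built in section 4.0.3:

```r
library(xgboost)

# Illustrative only: general parameters passed to xgb.train via `...`.
model <- xgb.train(
  data    = dtrain,
  booster = "gbtree",  # tree-based booster
  nthread = 4,         # usually left at the default (all cores)
  silent  = 1,         # suppress runtime messages
  nrounds = 10
)
```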

2.Booster Parameters

2.1 Parameters for Tree Booster

1.nrounds[default=100]
  • Controls the maximum number of iterations. For classification, this equals the number of trees built
  • Tune with CV
2.eta[default=0.3][range: (0,1)]
  • Controls the learning rate, i.e. the rate at which the model learns the patterns in the data. After every round, the model shrinks the feature weights to approach the optimum
  • A lower eta slows computation and must be paired with a higher nrounds
  • Typical values lie in 0.01 - 0.3
3.gamma[default=0][range: (0,Inf)]
  • Controls regularization (prevents overfitting). The optimal gamma depends on the data set and the other parameter values
  • The higher the value, the stronger the regularization. Regularization means penalizing large parameters that do not improve the model's performance. The default of 0 means no regularization
  • Tuning tip: start with 0. Check the cross-validation error; if train error >>> test error, bring gamma into play. The higher the gamma, the smaller the gap between training and test sets. When the tree depth (max_depth) is small, gamma can bring a performance boost
4.max_depth[default=6][range: (0,Inf)]
  • Controls the depth of the tree
  • The deeper the tree, the more complex the model and the higher the risk of overfitting. There is no standard value for this parameter; larger data sets call for deeper trees
  • Tune with CV
5.min_child_weight[default=1][range:(0,Inf)]
  • In regression, this is the minimum number of samples in each child node. In classification, if the sum of sample weights in a child node (computed from the second-order derivatives) falls below this value, splitting stops
6.subsample[default=1][range: (0,1)]
  • Controls the fraction of samples supplied to each tree
  • Typical values lie in 0.5 - 0.8
7.colsample_bytree[default=1][range: (0,1)]
  • Controls the fraction of features supplied to each tree
  • Typical values lie in 0.5 - 0.9
8.lambda[default=1]
  • Controls L2 regularization (equivalent to ridge regression), used to prevent overfitting
9.alpha[default=0]
  • Controls L1 regularization (equivalent to lasso regression), used to prevent overfitting; it can also be used for feature selection, which is especially useful on high-dimensional data sets
10.scale_pos_weight[default=1]
  • Controls the weight of positive versus negative examples, useful for imbalanced data sets. Consider setting it to (#negatives / #positives).
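Putting the tree-booster parameters above together, they are passed to xgboost as one named list; the values below are arbitrary starting points for illustration, not tuned recommendations:

```r
# Illustrative parameter list for the tree booster (values are not tuned).
params <- list(
  booster          = "gbtree",
  eta              = 0.1,   # learning rate
  max_depth        = 6,     # tree depth
  min_child_weight = 1,     # minimum child weight before a split stops
  gamma            = 0,     # regularization threshold for splits
  subsample        = 0.8,   # row sampling per tree
  colsample_bytree = 0.8,   # column sampling per tree
  lambda           = 1,     # L2 regularization
  alpha            = 0      # L1 regularization
)
# model <- xgboost::xgb.train(data = dtrain, params = params, nrounds = 100)
```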

2.2 Parameters for Linear Booster

1.nrounds[default=100]
  • Controls the maximum number of iterations
  • Tune with CV
2.lambda[default=0]
  • Controls L2 regularization (ridge regression), used to prevent overfitting
3.alpha[default=0]
  • Controls L1 regularization (lasso regression), used to prevent overfitting

2.3 Learning Task Parameters

1.Objective[default=reg:linear]
  • reg:linear - for linear regression
  • binary:logistic - logistic regression for binary classification. It returns class probabilities
  • multi:softmax - multiclassification using softmax objective. It returns predicted class labels. It requires setting num_class parameter denoting number of unique prediction classes.
  • multi:softprob - multiclassification using softmax objective. It returns predicted class probabilities.
2.eval_metric [no default, depends on objective selected]
  • Evaluates the model's accuracy on the validation set. For regression, the default metric is RMSE; for classification, it is error
  • The available metrics include:
    • mae - Mean Absolute Error (used in regression)
    • Logloss - Negative loglikelihood (used in classification)
    • AUC - Area under curve (used in classification)
    • RMSE - Root mean square error (used in regression)
    • error - Binary classification error rate [#wrong cases/#all cases]
    • mlogloss - multiclass logloss (used in classification)
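As the multi:softmax bullet notes, the multiclass objectives also require num_class; a minimal illustrative learning-task parameter list (the class count of 3 is hypothetical):

```r
# Illustrative learning-task parameters for a 3-class problem.
params <- list(
  objective   = "multi:softprob",  # returns per-class probabilities
  num_class   = 3,                 # required by the multi:* objectives
  eval_metric = "mlogloss"         # overrides the default classification metric
)
```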

4 Parameter Tuning Practice with XGBoost in R

Tuning strategy:

  • 1.Fix a relatively high learning rate (eta = 0.1) and let early_stopping_rounds in xgb.cv determine the number of trees (nrounds); grid-search the tree-specific parameters (max_depth, min_child_weight, gamma, subsample, colsample_bytree).
  • 2.Tune the regularization parameters (lambda, alpha).
  • 3.Run a grid search over all parameters, including scale_pos_weight.
  • 4.Lower the learning rate and determine the ideal nrounds.

4.0 AUC & KS Function, Grid Search Function, Loading the Data Set

4.0.1 AUC & KS Function

library(ROCR)
calc_auc_and_ks <- function(pred, y) {
  pred.obj1 <- ROCR::prediction(pred, y)
  
  ## AUC
  auc.tmp1 <- performance(pred.obj1, "auc")
  auc1 <- as.numeric(auc.tmp1@y.values)
  
  ## KS
  roc.tmp1 <- performance(pred.obj1, "tpr", "fpr")
  ks <- max(attr(roc.tmp1, "y.values")[[1]] - attr(roc.tmp1, "x.values")[[1]])
  
  # print(c(auc1, ks))
  return(list(auc1, ks))
}
# get cv-auc, cv-ks:
# cv_prediction <- xgb$pred
# calc_auc_and_ks(cv_prediction, y_train)

4.0.2 Grid Search Function

grid_search <- function(dtrain, y_train,
                        seed = 10, nthread=20, missing=NA, nrounds=10000, early_stopping_rounds=50, nfold=5, stratified=T, verbose=F, prediction = T,
                        eta=c(0.1),
                        max_depth=c(6),
                        min_child_weight=c(1),
                        gamma= c(0),
                        subsample=c(1),
                        colsample_bytree =c(1),
                        lambda = c(1),
                        alpha =c(0),
                        scale_pos_weight=c(1)){
  # create output data.frame:param(sep by ,), auc, ks,  auc_rank, ks_rank.
  output_df <- data.frame(t(rep(NA,11)))
  names(output_df) <- c("eta", "max_depth","min_child_weight","gamma","subsample","colsample_bytree","lambda","alpha","scale_pos_weight","cv_auc","cv_ks")
  rowkey <-1
  
  # create parameters grid
  to_tune = expand.grid(eta = eta,
                        max_depth = max_depth,
                        min_child_weight = min_child_weight,
                        gamma = gamma,
                        subsample = subsample,
                        colsample_bytree = colsample_bytree,
                        lambda = lambda,
                        alpha = alpha,
                        scale_pos_weight = scale_pos_weight)
  # for loop
  for (i in seq(dim(to_tune)[1])) {
    
    xgb_params = list(objective = "binary:logistic",
                      eval_metric = 'auc')
    xgb_params$eta = to_tune[i, 1]
    xgb_params$max_depth = to_tune[i, 2]
    xgb_params$min_child_weight = to_tune[i, 3]
    xgb_params$gamma = to_tune[i, 4]
    xgb_params$subsample = to_tune[i, 5]
    xgb_params$colsample_bytree = to_tune[i, 6]
    xgb_params$lambda = to_tune[i, 7]
    xgb_params$alpha = to_tune[i, 8]
    xgb_params$scale_pos_weight = to_tune[i, 9]
    set.seed(seed)
    start_tm <-Sys.time()
    xgb = xgb.cv(data = dtrain,
                 params = xgb_params,
                 nthread = nthread,
                 missing = missing,
                 nrounds = nrounds,
                 early_stopping_rounds = early_stopping_rounds,
                 nfold = nfold,
                 stratified = stratified,
                 verbose = verbose,
                 prediction = prediction
        )
    end_tm<-Sys.time()
#   print(paste0(rowkey, ' run time:', end_tm - start_tm))
    # get cv-auc, cv-ks:
    cv_prediction <- xgb$pred
    list_auc_ks <- calc_auc_and_ks(cv_prediction, y_train)
    auc_i <- list_auc_ks[[1]]
    ks_i <- list_auc_ks[[2]]
    
    # fill the output_df
    output_df[rowkey,] <- as.data.frame(t(c(xgb_params$eta, xgb_params$max_depth, xgb_params$min_child_weight, xgb_params$gamma, xgb_params$subsample, xgb_params$colsample_bytree, xgb_params$lambda, xgb_params$alpha, xgb_params$scale_pos_weight, auc_i, ks_i)))
    
#   print(paste0("rowkey=", rowkey, ">>parameters:eta=", xgb_params$eta, ",max_depth=", xgb_params$max_depth, ",min_child_weight=", xgb_params$min_child_weight, ",gamma=", xgb_params$gamma, ",subsample=", xgb_params$subsample, ",colsample_bytree=", xgb_params$colsample_bytree, ",lambda=", xgb_params$lambda, ",alpha=", xgb_params$alpha, ",scale_pos_weight=", xgb_params$scale_pos_weight))
#   print(paste0("rowkey=",rowkey,": auc=",auc_i,", ks=",ks_i,"...."))
    
    rowkey <- rowkey +1
  }
  # Rank auc and ks in descending order (rank 1 = best).
  output_df[, "desc_rank_auc"] <- nrow(output_df) + 1 - rank(output_df[, "cv_auc"])
  output_df[, "desc_rank_ks"] <- nrow(output_df) + 1 - rank(output_df[, "cv_ks"])
  return(output_df)
}

4.0.3 Loading the Data Set

library(xgboost)
library(dplyr)
df_train = read.csv("data/cs-training.csv", stringsAsFactors = FALSE) %>%
  na.omit() %>%  # delete the missing value
  select(-`X`)   # delete the first index column
train_data = as.matrix(df_train %>% select(-SeriousDlqin2yrs))
train_label = df_train$SeriousDlqin2yrs
dtrain <- xgb.DMatrix(data = train_data, label = train_label)

4.1 Tuning the Tree Parameters at a Relatively High Learning Rate

max_depth, min_child_weight, gamma, subsample, colsample_bytree

4.1.1 Tuning max_depth and min_child_weight

First tune the parameters coarsely over a wide range, then fine-tune over a narrow one. Depending on your machine's performance, you can widen the grid-search range and shrink the step sizes accordingly.

grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(3, 5, 7, 9),          # initial value set to [3-9]
                                 min_child_weight = c(1, 3, 5),      # initial value set to [1-5]
                                 gamma = c(0),                       # initial value set to 0
                                 subsample = c(0.8),                 # typical initial value set to 0.8, can be set to [0.5, 0.9]
                                 colsample_bytree = c(0.8),          # typical initial value set to 0.8, can be set to [0.5, 0.9]
                                 lambda = c(1),
                                 alpha = c(0),
                                 scale_pos_weight = c(1))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

The ideal max_depth is 3 and the ideal min_child_weight is 5, but we have not yet tried max_depth values below 3 or min_child_weight values above 5, so we keep searching around this parameter combination.

grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(1, 3, 5),
                                 min_child_weight = c(3, 5, 7),
                                 gamma = c(0),
                                 subsample = c(0.8),
                                 colsample_bytree = c(0.8),
                                 lambda = c(1),
                                 alpha = c(0),
                                 scale_pos_weight = c(1))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

The ideal max_depth is still 3 and the ideal min_child_weight is still 5. We refine further around this combination with a step size of 1 to find the ideal pair.

grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(2, 3, 4),
                                 min_child_weight = c(4, 5, 6),
                                 gamma = c(0),
                                 subsample = c(0.8),
                                 colsample_bytree = c(0.8),
                                 lambda = c(1),
                                 alpha = c(0),
                                 scale_pos_weight = c(1))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

In the end, the ideal max_depth is 3 and the ideal min_child_weight is 5.

4.1.2 Tuning gamma

  • Gamma Tuning
    • Always start with 0, use xgb.cv, and watch how the train/test CV fare. If train CV skyrockets over test CV at blazing speed, this is where gamma is useful instead of min_child_weight (because you need to control the complexity issued from the loss, not the loss derivative from the hessian weight in min_child_weight). Another typical and most preferred choice: step max_depth down.
    • If gamma is useful (i.e. train CV skyrockets while test CV can't follow), crank up gamma. This is where experience with tuning gamma pays off (so you lose the least amount of time). Depending on the gap you see between the train/test CV increase speeds, try to find an appropriate gamma. The higher the gamma, the smaller the train/test CV difference will be. If you have no idea what value to use, put 10 and see what happens.
  • How to set gamma values?
    • If your train/test CV always lie too close, you constrained the complexity of xgboost way too much, and the model can't grow trees without pruning them (the loss threshold is not reached because of gamma). Lower gamma (a good relative cut if you don't know: shave 20% off gamma until your test CV grows without the train CV being frozen).
    • If your train/test CV differ too much, you did not constrain the complexity of xgboost enough, and the model grows too many trees without pruning them. Put a higher gamma (a good absolute step if you don't know: +2, until your test CV can follow your now-slower train CV; your test CV should be able to peak).
    • If your train CV is stuck (not improving, or improving way too slowly), decrease gamma: the value was too high and xgboost keeps pruning trees until it finds something appropriate (or it may end in an endless loop of testing and adding nodes only to prune them straight away…).
grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(3),
                                 min_child_weight = c(5),
                                 gamma = c(0, 0.01, 0.1, 1, 3, 5, 10, 20),
                                 subsample = c(0.8),
                                 colsample_bytree = c(0.8),
                                 lambda = c(1),
                                 alpha = c(0),
                                 scale_pos_weight = c(1))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

The ideal gamma is 3; we refine around this value with a step size of 1.

grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(3),
                                 min_child_weight = c(5),
                                 gamma = c(2, 3, 4),
                                 subsample = c(0.8),
                                 colsample_bytree = c(0.8),
                                 lambda = c(1),
                                 alpha = c(0),
                                 scale_pos_weight = c(1))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

In the end, the ideal gamma is 3.

4.1.3 Tuning subsample and colsample_bytree

Grid-search subsample and colsample_bytree over the range 0.6 to 1.0 with a step size of 0.1.

grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(3),
                                 min_child_weight = c(5),
                                 gamma = c(3),
                                 subsample = c(0.6, 0.7, 0.8, 0.9, 1.0),
                                 colsample_bytree = c(0.6, 0.7, 0.8, 0.9, 1.0),
                                 lambda = c(1),
                                 alpha = c(0),
                                 scale_pos_weight = c(1))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

In the end, the ideal subsample is 0.7 and the ideal colsample_bytree is 0.8.

4.2 Tuning the Regularization Parameters alpha and lambda

The gamma parameter already offers a more effective way to curb overfitting, so alpha and lambda can be tuned relatively coarsely.

grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(3),
                                 min_child_weight = c(5),
                                 gamma = c(3),
                                 subsample = c(0.7),
                                 colsample_bytree = c(0.8),
                                 lambda = c(0, 1e-5, 1e-2, 0.1, 1, 100),
                                 alpha = c(0, 1e-5, 1e-2, 0.1, 1, 100),
                                 scale_pos_weight = c(1))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

The ideal lambda is 100 and the ideal alpha is 1.

4.3 Grid Search over All Parameters

weight = (length(train_label) - sum(train_label)) / sum(train_label)
grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(2, 3, 4),
                                 min_child_weight = c(4, 5, 6),
                                 gamma = c(2, 3, 4),
                                 subsample = c(0.6, 0.7, 0.8),
                                 colsample_bytree = c(0.7, 0.8, 0.9),
                                 lambda = c(100),
                                 alpha = c(1),
                                 scale_pos_weight = c(1, weight))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

In the end: max_depth = 4, min_child_weight = 5, gamma = 3, subsample = 0.6, colsample_bytree = 0.9, lambda = 100, alpha = 1, scale_pos_weight = 1.

4.4 Lowering the Learning Rate

Lower the learning rate in steps of a factor of 10, searching over eta = 0.001, 0.01, 0.1.

grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.001, 0.01, 0.1),
                                 max_depth = c(4),
                                 min_child_weight = c(5),
                                 gamma = c(3),
                                 subsample = c(0.6),
                                 colsample_bytree = c(0.9),
                                 lambda = c(100),
                                 alpha = c(1),
                                 scale_pos_weight = c(1))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

The ideal eta is 0.1; we search further around 0.1 with a step size of 0.05.

grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.05, 0.1, 0.15, 0.2, 0.25, 0.3),
                                 max_depth = c(4),
                                 min_child_weight = c(5),
                                 gamma = c(3),
                                 subsample = c(0.6),
                                 colsample_bytree = c(0.9),
                                 lambda = c(100),
                                 alpha = c(1),
                                 scale_pos_weight = c(1))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

The ideal eta is still 0.1; we search further around 0.1 with a step size of 0.01.

grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.06, 0.07, 0.08, 0.09, 0.1, 0.11, 0.12, 0.13, 0.14),
                                 max_depth = c(4),
                                 min_child_weight = c(5),
                                 gamma = c(3),
                                 subsample = c(0.6),
                                 colsample_bytree = c(0.9),
                                 lambda = c(100),
                                 alpha = c(1),
                                 scale_pos_weight = c(1))
grid_search_result
grid_search_result %>% filter(desc_rank_auc == 1)

In the end, the ideal eta is 0.12. Next, find the nrounds corresponding to this eta.

bst_params = list(objective = "binary:logistic",
                  eval_metric = 'auc',
                  eta = c(0.12),
                  max_depth = c(4),
                  min_child_weight = c(5),
                  gamma = c(3),
                  subsample = c(0.6),
                  colsample_bytree = c(0.9),
                  lambda = c(100),
                  alpha = c(1),
                  scale_pos_weight = c(1))
set.seed(10)
bst_cv = xgb.cv(data = dtrain,
                 params = bst_params,
              #  nthread = 20,
                 missing = NA,
                 nrounds = 10000,
                 early_stopping_rounds = 50,
                 nfold = 5,
                 stratified = T,
                 verbose = F,
                 prediction = T
    )
calc_auc_and_ks(bst_cv$pred, train_label)
[[1]]
[1] 0.8566172

[[2]]
[1] 0.5591036
bst_cv$niter
[1] 259
bst_cv$evaluation_log

In the end, the ideal nrounds is 259.

5 Model Evaluation

5.1 Confusion Matrix

library(caret)
library(e1071)
xgb_pred <- ifelse(bst_cv$pred > 0.5, 1, 0)
confusionMatrix(factor(xgb_pred), factor(train_label))  # caret expects factors
Confusion Matrix and Statistics

          Reference
Prediction      0      1
         0 110803   6741
         1   1109   1616
                                          
               Accuracy : 0.9347          
                 95% CI : (0.9333, 0.9361)
    No Information Rate : 0.9305          
    P-Value [Acc > NIR] : 3.372e-09       
                                          
                  Kappa : 0.2666          
 Mcnemar's Test P-Value : < 2.2e-16       
                                          
            Sensitivity : 0.9901          
            Specificity : 0.1934          
         Pos Pred Value : 0.9427          
         Neg Pred Value : 0.5930          
             Prevalence : 0.9305          
         Detection Rate : 0.9213          
   Detection Prevalence : 0.9773          
      Balanced Accuracy : 0.5917          
                                          
       'Positive' Class : 0               
                                          

5.2 ROC Curve

library(pROC)
modelroc = roc(train_label, bst_cv$pred)
plot(modelroc, print.auc = T, auc.polygon = T, grid = c(0.1, 0.2), grid.col = c('green', 'red'), max.auc.polygon = T, auc.polygon.col = 'skyblue', print.thres = T)

5.3 Train and Cross-Validation AUC versus Training Round

library(tidyr)
library(ggplot2)
bst_cv$evaluation_log %>%
  select(-contains("std")) %>%
  gather(TestOrTrain, AUC, -iter) %>%
  ggplot(aes(x = iter, y = AUC, group = TestOrTrain, color = TestOrTrain)) + 
  geom_line() + 
  theme_bw()

6 Model Training

Train the model on the full training set with xgb.train, using the tuned parameters.

bst_params = list(objective = "binary:logistic",
                  eval_metric = 'auc',
                  eta = c(0.12),
                  max_depth = c(4),
                  min_child_weight = c(5),
                  gamma = c(3),
                  subsample = c(0.6),
                  colsample_bytree = c(0.9),
                  lambda = c(100),
                  alpha = c(1),
                  scale_pos_weight = c(1))
bst = xgb.train(data = dtrain,
                params = bst_params,
                # nthread = 20,
                missing = NA,
                nrounds = 259,
                verbose = F
                )

7 Model Interpretability

7.1 Feature Importance Scores

importance <- xgb.importance(feature_names = colnames(train_data), model = bst)
importance
  • Gain is the improvement in accuracy brought by a feature to the branches it is on. The idea is that before adding a new split on feature X to the branch there were some wrongly classified elements; after adding the split on this feature, there are two new branches, and each of these branches is more accurate (one branch saying that if your observation is on this branch then it should be classified as 1, and the other branch saying the exact opposite).

  • Cover measures the relative quantity of observations concerned by a feature.

  • Frequency is a simpler way to measure the Gain. It just counts the number of times a feature is used in all generated trees. You should not use it (unless you know why you want to use it).

7.2 Plotting the Feature Importance Scores

xgb.plot.importance(importance_matrix = importance) 

7.3 Interpreting the Effect of Variables

importanceRaw <- xgb.importance(feature_names = colnames(train_data), model = bst, data = train_data, label = train_label)
with=FALSE ignored, it isn't needed when using :=. See ?':=' for examples.
importanceClean <- importanceRaw[,`:=`(Cover=NULL, Frequency=NULL)]

7.4 Plotting the Model

bst_params = list(objective = "binary:logistic",
                  eval_metric = 'auc',
                  eta = c(0.12),
                  max_depth = c(4),
                  min_child_weight = c(5),
                  gamma = c(3),
                  subsample = c(0.6),
                  colsample_bytree = c(0.9),
                  lambda = c(100),
                  alpha = c(1),
                  scale_pos_weight = c(1))
bst_graph = xgb.train(data = dtrain,
                      params = bst_params,
                      # nthread = 20,
                      missing = NA,
                      nrounds = 2,
                      verbose = F
                      )
library(DiagrammeR)
xgb.plot.tree(model = bst_graph)

8 Random Forest

Random forests and gradient-boosted decision trees are both ensemble algorithms. Both train many decision trees on one data set; the difference is that in a random forest every tree is independent, while in gradient boosting every tree corrects the trees before it.

8.1 Model Training

A random forest can also be implemented through XGBoost; below we build one consisting of 1000 decision trees.

rf <- xgb.train(data = dtrain, max_depth = 4, num_parallel_tree = 1000,
                subsample = 0.8, colsample_bytree = 0.8, nrounds = 1,
                objective = "binary:logistic", eval_metric = 'auc')

8.2 Model Evaluation

rf_pred <- predict(rf, dtrain)
calc_auc_and_ks(rf_pred, train_label)
[[1]]
[1] 0.8483992

[[2]]
[1] 0.5391836
modelroc = roc(train_label, rf_pred, thresholds = 0.5)
plot(modelroc,print.auc=T,auc.polygon=T,grid=c(0.1,0.2),grid.col=c('green','red'),max.auc.polygon=T,auc.polygon.col='skyblue',print.thres=T)

rf_pred <- predict(rf, dtrain)
rf_pred <- if_else(rf_pred > 0.5, 1, 0)
confusionMatrix(factor(rf_pred), factor(train_label))  # caret expects factors
Confusion Matrix and Statistics

          Reference
Prediction      0      1
         0 111396   7377
         1    516    980
                                         
               Accuracy : 0.9344         
                 95% CI : (0.933, 0.9358)
    No Information Rate : 0.9305         
    P-Value [Acc > NIR] : 5.774e-08      
                                         
                  Kappa : 0.1817         
 Mcnemar's Test P-Value : < 2.2e-16      
                                         
            Sensitivity : 0.9954         
            Specificity : 0.1173         
         Pos Pred Value : 0.9379         
         Neg Pred Value : 0.6551         
             Prevalence : 0.9305         
         Detection Rate : 0.9262         
   Detection Prevalence : 0.9876         
      Balanced Accuracy : 0.5563         
                                         
       'Positive' Class : 0              
                                         

8.3 Feature Importance Scores

importance <- xgb.importance(feature_names = colnames(train_data), model = rf)
importance
xgb.plot.importance(importance_matrix = importance) 

LS0tDQp0aXRsZTogIlhHQm9vc3QgYW5kIFBhcmFtZXRlciBUdW5pbmcgaW4gUiINCm91dHB1dDogaHRtbF9ub3RlYm9vaw0KLS0tDQoNCsS/wrwNCg0KKyAxIFhHQm9vc3S1xNPFysYNCisgMiBYR0Jvb3N0tcS5pNf31K3A7Q0KKyAzIFhHQm9vc3S1xLLOyv0NCisgNCBYR0Jvb3N01NpS1tC1xLX3ss7Ktbz5DQogICAgKyA0LjAgQVVDICYgS1O6r8r9oaLN+Ljxy9HL97qvyv2hotTYyOvK/b7dvK8NCiAgICAgICAgKyA0LjAuMSBBVUMgJiBLU7qvyv0NCiAgICAgICAgKyA0LjAuMiDN+Ljxy9HL97qvyv0NCiAgICAgICAgKyA0LjAuMyDU2Mjryv2+3byvDQogICAgKyA0LjEg1Nq9z7jftcTRp8+wy9nCys/Co6y9+NDQvvay38r3ss7K/bX308UNCiAgICAgICAgKyA0LjEuMSBtYXhfZGVwdGggus0gbWluX2NoaWxkX3dlaWdodCCyzsr9tffTxQ0KICAgICAgICArIDQuMS4yIGdhbW1hILLOyv2199PFDQogICAgICAgICsgNC4xLjMgc3Vic2FtcGxlILrNIGNvbHNhbXBsZV9ieXRyZWUgss7K/bX308UNCiAgICArIDQuMiBhbHBoYSC6zSBsYW1iZGEg1f3U8ruvss7K/bX308UNCiAgICArIDQuMyC21Mv509Cyzsr9vfjQ0M34uPHL0cv3DQogICAgKyA0LjQgvbW1zdGnz7DL2cLKDQorIDUgxKPQzcbAvNsNCiAgICArIDUuMSC77M/9vtjV8w0KICAgICsgNS4yIFJPQ8f6z98NCiAgICArIDUuMyDRtcG3vK+hor27subR6dakQVVDz+C21NPa0bXBt8LWtM61xLHku6/H98rGzbwNCisgNiDEo9DN0bXBtw0KKyA3IMSj0M2/yb3iys3Q1A0KICAgICsgNy4xIMzY1ffW2NKq0NTGwLfWDQogICAgKyA3LjIgu+bWxszY1ffW2NKq0NTGwLfWDQogICAgKyA3LjMgveLKzbHkwb+1xNf308MNCiAgICArIDcuNCC75tbGxKPQzQ0KKyA4IMvmu/rJrcHWDQogICAgKyA4LjEgxKPQzdG1wbcNCiAgICArIDguMiDEo9DNxsC82w0KICAgICsgOC4zIMzY1ffW2NKq0NTGwLfWDQoNCiMgMSBYR0Jvb3N0tcTTxcrGDQorIDEusqLQ0LzGy+Ojusq508NPcGVuTVC9+NDQsqLQ0LzGy+OjrMSsyM/KudPDvMbL47v6tcTL+dPQussNCg0KKyAyLtX91PK7r6O6WEdCb29zdNfutPO1xNPFysbU2tPav8nS1NX91PK7r6Ost8DWubn9xOK6zw0KDQorIDMuvbuy5tHp1qSjulhHQm9vc3TE2tbDvbuy5tHp1qS6r8r9DQoNCisgNC7Iscqn1rWjulhHQm9vc3S/ydLUtKbA7cixyqfWtaOsxKPQzb/J0tSytte9tb3Iscqn1rXUzLqstcTH98rGDQoNCisgNS7B6bvu0NSjutans9bTw7un19S2qNLlxL+x6rqvyv26zcbAvNvWuLHqDQoNCisgNi6/ybvxtcPQ1KO6WEdCb29zdNans9ZSLCBQeXRob24sIEphdmEsIEp1bGlhLCBTY2FsYSC1yNPv0dQNCg0KKyA3LrGjtOa6zdTYyOujulhHQm9vc3S/ydLUsaO05rrN1NjI68r9vt2+2NXzus3Eo9DNDQoNCisgOC689Namo7pYR0Jvb3N0ytfPyL2owaLX7rTzye62yLXEyvejrNTZ19TPwrb4yc+89Mily/DKp7qvyv21xLz1ydm1zdPa49DWtbXEyvfWpg0KDQojIDIgWEdCb29zdLXEuaTX99StwO0NCisgMS631sDgzsrM4qO6yrnTw2Bib29zdGVyID0gZ2J0cmVlYLLO
yv2ho8O/0ru/w8r3trzKx9Ta1q7HsMr3tcS7+bShyc+9qMGio6zNqLn9uPjWrsewtcTK987zt9a1xLXjuLPT6Lj8uN+1xMio1tijrMC0vbW1zb3Tz8LAtMLWtM7W0LXEzvO31sLKoaMNCg0KKyAyLrvYuenOyszio7rKudPDYGJvb3N0ZXIgPSBnYnRyZWVgILrNIGBib29zdGVyID0gZ2JsaW5lYXJgss7K/aGjyrnTw2BnYmxpbmVhcmCyzsr9yrGjrL2owaK549Llz9/Q1MSj0M2jrLKiyrnTw6OoTDEsTDKjqdX91PK7r7rNzN22yM/CvbW3qKGjuvPQ+LXExKPQzba8yse21Naux7DEo9DNtcSy0LLuvfjQ0MTius+how0KDQojIDMgWEdCb29zdLXEss7K/Q0KWEdCb29zdLXEss7K/b/J0tSxu7fWzqozwOCjuiAgDQoNCisgR2VuZXJhbCBQYXJhbWV0ZXJzOiBDb250cm9scyB0aGUgYm9vc3RlciB0eXBlIGluIHRoZSBtb2RlbCB3aGljaCBldmVudHVhbGx5IGRyaXZlcyBvdmVyYWxsIGZ1bmN0aW9uaW5nDQorIEJvb3N0ZXIgUGFyYW1ldGVyczogQ29udHJvbHMgdGhlIHBlcmZvcm1hbmNlIG9mIHRoZSBzZWxlY3RlZCBib29zdGVyDQorIExlYXJuaW5nIFRhc2sgUGFyYW1ldGVyczogU2V0cyBhbmQgZXZhbHVhdGVzIHRoZSBsZWFybmluZyBwcm9jZXNzIG9mIHRoZSBib29zdGVyIGZyb20gdGhlIGdpdmVuIGRhdGENCg0KIyMjIDEuR2VuZXJhbCBQYXJhbWV0ZXJzDQojIyMjIyAxLkJvb3N0ZXJbZGVmYXVsdD1nYnRyZWVdDQrJ6NbDYm9vc3RlcsDg0M2jqGdidHJlZSwgZ2JsaW5lYXIsIGRhcnSjqaGjttTT2rfWwODOyszio6y/ydLUyrnTw2didHJlZSwgZ2JsaW5lYXKju7bU09q72LnpzsrM4qOsv8nS1Mq508PIzrrOwODQzaGjDQoNCiMjIyMjIDIubnRocmVhZFtkZWZhdWx0PW1heGltdW0gY29yZXMgYXZhaWxhYmxlXQ0KxvS2r7Ki0NC8xsvjoaPNqLOjsrvQ6NKquMSx5NXi0ruyzsr9o6zS8s6qxKzIz8q508PL+dPQusujrL/J0tS0+MC01+6/7LXEvMbL48vZtsihow0KDQojIyMjIyAzLnNpbGVudFtkZWZhdWx0PTBdDQrI57n7yejOqjGjrFIgY29uc29sZbvhsbvUy9DQ0MXPotHNw7ujrNfuusOyu9KquMSx5LjEseTV4tK7ss7K/aGjDQoNCiMjIyAyLkJvb3N0ZXIgUGFyYW1ldGVycw0KIyMjIyAyLjEgUGFyYW1ldGVycyBmb3IgVHJlZSBCb29zdGVyDQojIyMjIyAxLm5yb3VuZHNbZGVmYXVsdD0xMDBdDQorIL/Y1sbX7rTztfy0+rTOyv2ho7bU09q31sDgzsrM4qOsz+C1sdPavajBorXEyve1xL/DyvcNCisgyrnTw0NWvfjQ0LX3ss4NCg0KIyMjIyMgMi5ldGFbZGVmYXVsdD0wLjNdW3JhbmdlOiAoMCwxKV0NCisgv9jWxtGnz7DL2cLKo6y8tMSj0M3Rp8+wyv2+3cSjyr21xMvZwsqho8O/0rvC1rTOuvOjrMSj0M22vLvh0bnL9czY1ffIqNbY0tS077W91+7TxbuvDQorIL3Ptc21xGV0Yb21tc28xsvjy9m2yKOs0OjSqsXkus+4/LjftcRucm91bmRzDQorILXk0M21xMih1rXU2jAuMDEgLSAwLjMNCg0KIyMjIyMgMy5nYW1tYVtkZWZhdWx0PTBdW3JhbmdlOiAoMCxJbmYpXQ0KKyC/2NbG1f3U8ruvo6i3wNa5uf3E4rrPo6mho2dhbW1htcTX7tPFyKHWtcih
vvbT2sr9vt28r7rNxuTL+7LOyv3WtQ0KKyDIoda11L2436Os1f3U8ruvwaa2yNS9tPOho9X91PK7r9LizrbXxbbUw7vT0LjEycbEo9DNse3P1sfSyKHWtb3PtPO1xLLOyv3KqdLUs823o6GjxKzIz9a1zqowo6zS4s6218XDu9PQ1f3U8ruvDQorILX3ss68vMfJo7rPyL2rss7K/da1yejOqjCho7zssum9u7Lm0enWpLTtzvPCyqOsyOe5+3RyYWluIGVycm9yID4+PiB0ZXN0IGVycm9yo6zU8tL9yOtnYW1tYbLOyv2ho2dhbW1h1rXUvbjfo6zRtcG3vK+6zbLiytS8r7XEsu6+4NS90KGho8jnufvK97XEye62yKOobWF4X2RlcHRoo6m9z9Cho6xnYW1tYbLOyv274bT4wLTQ1MTctcTM4cn9DQoNCiMjIyMjIDQubWF4X2RlcHRoW2RlZmF1bHQ9Nl1bcmFuZ2U6ICgwLEluZildDQorIL/Y1sbK97XEye62yA0KKyDK97XEye62yNS9tPOjrMSj0M3Uvbi01NOjrLn9xOK6z7XEv8nE3NDU1L2086Gj1eLSu7LOyv3Du9PQserXvMih1rWho7j8tPO1xMr9vt28r6Os0OjSqrj8ye61xMr3ye62yKGjDQorIMq508NDVr340NC197LODQoNCiMjIyMjIDUubWluX2NoaWxkX3dlaWdodFtkZWZhdWx0PTFdW3JhbmdlOigwLEluZildDQorINTau9i56dbQo6y0+rHtw7/Su7j219O92rXj1tC1xNfu0KHR+bG+yv2ho9Tat9bA4NbQo6zI57n719O92rXjtcTR+bG+yKjW2Nauus2jqNPJtv6918artbzK/bzGy+O1w7W9o6nQodPauMOyzsr91rWjrNTyzaPWubfWwdENCg0KIyMjIyMgNi5zdWJzYW1wbGVbZGVmYXVsdD0xXVtyYW5nZTogKDAsMSldDQorIL/Y1sbDv9K7v8PK98q508O1xNH5sb7K/Q0KKyC15NDNtcTIoda11NowLjUgLSAwLjgNCg0KIyMjIyMgNy5jb2xzYW1wbGVfYnl0cmVlW2RlZmF1bHQ9MV1bcmFuZ2U6ICgwLDEpXQ0KKyC/2NbGw7/Su7/DyvfKudPDtcTM2NX3yv0NCisgteTQzbXEyKHWtdTaMC41IC0gMC45DQoNCiMjIyMjIDgubGFtYmRhW2RlZmF1bHQ9MV0NCisgv9jWxkwy1f3U8ruvo6jP4LWx09rB67vYuemjqaOs08PAtLfA1rm5/cTius8NCg0KIyMjIyMgOS5hbHBoYVtkZWZhdWx0PTBdDQorIL/Y1sZMMdX91PK7r6Ooz+C1sdPabGFzc2+72Lnpo6mjrNPDwLS3wNa5uf3E4rrPo6y7ub/J0tTTw8C01/bM2NX30aHU8aOs1Nq4386syv2+3byv1tC4/NPQ08MNCg0KIyMjIyMgMTAuc2NhbGVfcG9zX3dlaWdodFtkZWZhdWx0PTFdDQorIL/Y1sbV/bi6wP21xMio1tijrLbUsrvGvbriyv2+3byv09DTw6Gjv8nS1L+8wsfJ6NbDzqq4usD9yv0v1f3A/cr9oaMNCg0KIyMjIyAyLjIgUGFyYW1ldGVycyBmb3IgTGluZWFyIEJvb3N0ZXINCiMjIyMjIDEubnJvdW5kc1tkZWZhdWx0PTEwMF0NCisgv9jWxtfutPO1/LT6tM7K/Q0KKyDKudPDQ1a9+NDQtfeyzg0KDQojIyMjIyAyLmxhbWJkYVtkZWZhdWx0PTBdDQorIL/Y1sZMMtX91PK7r6OsvLTB67vYuemjrNPDwLS3wNa5uf3E4rrPDQoNCiMjIyMjIDMuYWxwaGFbZGVmYXVsdD0wXQ0KKyC/2NbGTDHV/dTyu6+jrLy0bGFzc2+72Lnpo6zTw8C0t8DWubn9xOK6zw0KDQojIyMjIDIuMyBMZWFybmluZyBUYXNrIFBhcmFt
ZXRlcnMNCiMjIyMjIDEuT2JqZWN0aXZlW2RlZmF1bHQ9cmVnOmxpbmVhcl0NCisgcmVnOmxpbmVhciAtIGZvciBsaW5lYXIgcmVncmVzc2lvbg0KKyBiaW5hcnk6bG9naXN0aWMgLSBsb2dpc3RpYyByZWdyZXNzaW9uIGZvciBiaW5hcnkgY2xhc3NpZmljYXRpb24uIEl0IHJldHVybnMgY2xhc3MgcHJvYmFiaWxpdGllcw0KKyBtdWx0aTpzb2Z0bWF4IC0gbXVsdGljbGFzc2lmaWNhdGlvbiB1c2luZyBzb2Z0bWF4IG9iamVjdGl2ZS4gSXQgcmV0dXJucyBwcmVkaWN0ZWQgY2xhc3MgbGFiZWxzLiBJdCByZXF1aXJlcyBzZXR0aW5nIG51bV9jbGFzcyBwYXJhbWV0ZXIgZGVub3RpbmcgbnVtYmVyIG9mIHVuaXF1ZSBwcmVkaWN0aW9uIGNsYXNzZXMuDQorIG11bHRpOnNvZnRwcm9iIC0gbXVsdGljbGFzc2lmaWNhdGlvbiB1c2luZyBzb2Z0bWF4IG9iamVjdGl2ZS4gSXQgcmV0dXJucyBwcmVkaWN0ZWQgY2xhc3MgcHJvYmFiaWxpdGllcy4NCg0KIyMjIyMgMi5ldmFsX21ldHJpYyBbbm8gZGVmYXVsdCwgZGVwZW5kcyBvbiBvYmplY3RpdmUgc2VsZWN0ZWRdDQorINPDwLTGwLnAxKPQzdTa0enWpLyvyc+1xNe8yLfCyqGjttTT2rvYuenOyszio6zErMjPxsC829a4serOqlJNU0Wju7bU09q31sDgzsrM4qOsxKzIz8bAvNvWuLHqzqplcnJvcg0KKyC/ydGhtcTGwLzb1rix6sjnz8KjuiAgDQogICAgKyBtYWUgLSBNZWFuIEFic29sdXRlIEVycm9yICh1c2VkIGluIHJlZ3Jlc3Npb24pDQogICAgKyBMb2dsb3NzIC0gTmVnYXRpdmUgbG9nbGlrZWxpaG9vZCAodXNlZCBpbiBjbGFzc2lmaWNhdGlvbikNCiAgICArIEFVQyAtIEFyZWEgdW5kZXIgY3VydmUgKHVzZWQgaW4gY2xhc3NpZmljYXRpb24pDQogICAgKyBSTVNFIC0gUm9vdCBtZWFuIHNxdWFyZSBlcnJvciAodXNlZCBpbiByZWdyZXNzaW9uKQ0KICAgICsgZXJyb3IgLSBCaW5hcnkgY2xhc3NpZmljYXRpb24gZXJyb3IgcmF0ZSBbI3dyb25nIGNhc2VzLyNhbGwgY2FzZXNdDQogICAgKyBtbG9nbG9zcyAtIG11bHRpY2xhc3MgbG9nbG9zcyAodXNlZCBpbiBjbGFzc2lmaWNhdGlvbikNCg0KIyA0IFhHQm9vc3TU2lLW0LXEtfeyzsq1vPkNCrX3ss6y38LUo7ogIA0KDQorIDEu0aHU8b3PuN+1xNGnz7DL2cLKKGV0YSmho9K7sOPH6b/2z8KjrLP1yrzRp8+wy9nCyrXE1rXOqjAuMaGjtavKx6OsttTT2rK7zay1xM7KzOKjrMDtz+u1xNGnz7DL2cLK09DKsbryu+HU2jAuMDW1vTAuM9auvOSyqLavoaPNqLn9eGdiLmN2uq/K/bXEZWFybHlfc3RvcHBpbmdfcm91bmRzss7K/cC0v9jWxtfu08W1xL72st/K98r9wb8obnJvdW5kcymho7bU09q4+LaotcTRp8+wy9nCyqOsvfjQ0L72st/K98zYtqiyzsr9tffTxShtYXhfZGVwdGgsIG1pbl9jaGlsZF93ZWlnaHQsIGdhbW1hLCBzdWJzYW1wbGUsIGNvbHNhbXBsZV9ieXRyZWUpoaMNCg0KKyAyLlhHQm9vc3S1xNX91PK7r7LOyv21xLX308WhoyhsYW1iZGEsIGFscGhhKaGj1eLQqbLOyv2/ydLUvbW1zcSj0M21xLi01NO2yKOstNO2+MzhuN/Eo9DNtcSx7c/WoaMN

+ 3.Together with the scale_pos_weight parameter, run a grid search over all parameters in the neighborhood of the combination found by the greedy, group-by-group tuning above.

+ 4.Lower the learning rate and determine the ideal number of trees (nrounds).
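Steps 1 and 4 of the strategy both rely on early stopping inside cross-validation. The sketch below shows that idea in isolation; it is not part of the original pipeline, and it assumes a `dtrain` xgb.DMatrix like the one built in section 4.0.3.

```{r}
# Sketch only: fix eta and let early stopping choose nrounds.
# Assumes `dtrain` is an xgb.DMatrix as constructed in section 4.0.3.
library(xgboost)

cv <- xgb.cv(data = dtrain,
             params = list(objective = "binary:logistic",
                           eval_metric = "auc",
                           eta = 0.1),
             nrounds = 10000,             # generous upper bound
             early_stopping_rounds = 50,  # stop once test AUC stalls for 50 rounds
             nfold = 5,
             verbose = FALSE)

cv$best_iteration  # the nrounds to carry into the rest of the tuning
```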
## 4.0 AUC & KS function, grid search function, and data loading
### 4.0.1 AUC & KS function
```{r}
library(ROCR)
calc_auc_and_ks <- function(pred, y) {
  pred.obj1 <- ROCR::prediction(pred, y)
  
  ## AUC
  auc.tmp1 <- performance(pred.obj1, "auc")
  auc1 <- as.numeric(auc.tmp1@y.values)
  
  ## KS
  roc.tmp1 <- performance(pred.obj1, "tpr", "fpr")
  ks <- max(attr(roc.tmp1, "y.values")[[1]] - attr(roc.tmp1, "x.values")[[1]])
  
  # print(c(auc1, ks))
  return(list(auc1, ks))
}

# get cv-auc, cv-ks:
# cv_prediction <- xgb$pred
# calc_auc_and_ks(cv_prediction, y_train)
```

### 4.0.2 Grid search function
```{r}
grid_search <- function(dtrain, y_train,
                        seed = 10, nthread = 20, missing = NA, nrounds = 10000,
                        early_stopping_rounds = 50, nfold = 5, stratified = T,
                        verbose = F, prediction = T,
                        eta = c(0.1),
                        max_depth = c(6),
                        min_child_weight = c(1),
                        gamma = c(0),
                        subsample = c(1),
                        colsample_bytree = c(1),
                        lambda = c(1),
                        alpha = c(0),
                        scale_pos_weight = c(1)) {

  # create output data.frame: one parameter column each, plus cv_auc, cv_ks
  output_df <- data.frame(t(rep(NA, 11)))
  names(output_df) <- c("eta", "max_depth", "min_child_weight", "gamma",
                        "subsample", "colsample_bytree", "lambda", "alpha",
                        "scale_pos_weight", "cv_auc", "cv_ks")
  rowkey <- 1

  # create parameters grid
  to_tune = expand.grid(eta = eta,
                        max_depth = max_depth,
                        min_child_weight = min_child_weight,
                        gamma = gamma,
                        subsample = subsample,
                        colsample_bytree = colsample_bytree,
                        lambda = lambda,
                        alpha = alpha,
                        scale_pos_weight = scale_pos_weight)

  # fit one cross-validated model per parameter combination
  for (i in seq(dim(to_tune)[1])) {

    xgb_params = list(
      objective = "binary:logistic",
      eval_metric = 'auc')

    xgb_params$eta = to_tune[i, 1]
    xgb_params$max_depth = to_tune[i, 2]
    xgb_params$min_child_weight = to_tune[i, 3]
    xgb_params$gamma = to_tune[i, 4]
    xgb_params$subsample = to_tune[i, 5]
    xgb_params$colsample_bytree = to_tune[i, 6]
    xgb_params$lambda = to_tune[i, 7]
    xgb_params$alpha = to_tune[i, 8]
    xgb_params$scale_pos_weight = to_tune[i, 9]

    set.seed(seed)
    start_tm <- Sys.time()
    xgb = xgb.cv(data = dtrain,
                 params = xgb_params,
                 nthread = nthread,
                 missing = missing,
                 nrounds = nrounds,
                 early_stopping_rounds = early_stopping_rounds,
                 nfold = nfold,
                 stratified = stratified,
                 verbose = verbose,
                 prediction = prediction
    )
    end_tm <- Sys.time()
#   print(paste0(rowkey, ' run time:', end_tm - start_tm))

    # get cv-auc, cv-ks:
    cv_prediction <- xgb$pred
    list_auc_ks <- calc_auc_and_ks(cv_prediction, y_train)
    auc_i <- list_auc_ks[[1]]
    ks_i <- list_auc_ks[[2]]

    # fill the output_df
    output_df[rowkey, ] <- as.data.frame(t(c(xgb_params$eta, xgb_params$max_depth, xgb_params$min_child_weight, xgb_params$gamma, xgb_params$subsample, xgb_params$colsample_bytree, xgb_params$lambda, xgb_params$alpha, xgb_params$scale_pos_weight, auc_i, ks_i)))

#   print(paste0("rowkey=", rowkey, ">>parameters:eta=", xgb_params$eta, ",max_depth=", xgb_params$max_depth, ",min_child_weight=", xgb_params$min_child_weight, ",gamma=", xgb_params$gamma, ",subsample=", xgb_params$subsample, ",colsample_bytree=", xgb_params$colsample_bytree, ",lambda=", xgb_params$lambda, ",alpha=", xgb_params$alpha, ",scale_pos_weight=", xgb_params$scale_pos_weight))
#   print(paste0("rowkey=", rowkey, ": auc=", auc_i, ", ks=", ks_i, "...."))

    rowkey <- rowkey + 1
  }

  # Rank the auc, ks descending.
  output_df[, "desc_rank_auc"] <- dim(output_df)[1] + 1 - as.data.frame(rank(output_df[, "cv_auc"]))
  output_df[, "desc_rank_ks"] <- dim(output_df)[1] + 1 - as.data.frame(rank(output_df[, "cv_ks"]))

  return(output_df)
}
```

### 4.0.3 Loading the data set
```{r}
library(xgboost)
library(dplyr)

df_train = read.csv("data/cs-training.csv", stringsAsFactors = FALSE) %>%
  na.omit() %>%  # drop rows with missing values
  select(-`X`)   # drop the first index column

train_data = as.matrix(df_train %>% select(-SeriousDlqin2yrs))
train_label = df_train$SeriousDlqin2yrs

dtrain <- xgb.DMatrix(data = train_data, label = train_label)
```


## 4.1 Tuning the tree-specific parameters at a relatively high learning rate
max_depth, min_child_weight, gamma, subsample, colsample_bytree

### 4.1.1 Tuning max_depth and min_child_weight
First tune coarsely over a wide range, then fine-tune over a narrow one. Depending on your machine's capacity, you can widen the grid search range and reduce the parameter step size.

```{r}
grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(3, 5, 7, 9),      # initial range set to [3-9]
                                 min_child_weight = c(1, 3, 5),  # initial range set to [1-5]
                                 gamma = c(0),                   # initial value set to 0
                                 subsample = c(0.8),             # typical initial value 0.8, can be set in [0.5, 0.9]
                                 colsample_bytree = c(0.8),      # typical initial value 0.8, can be set in [0.5, 0.9]
                                 lambda = c(1),
                                 alpha = c(0),
                                 scale_pos_weight = c(1))
```

```{r}
grid_search_result
```

```{r}
grid_search_result %>% filter(desc_rank_auc == 1)
```

The best max_depth is 3 and the best min_child_weight is 5, but we have not yet tried max_depth values below 3 or min_child_weight values above 5, so we keep searching around this combination.
```{r}
grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(1, 3, 5),
                                 min_child_weight = c(3, 5, 7),
                                 gamma = c(0),
                                 subsample = c(0.8),
                                 colsample_bytree = c(0.8),
                                 lambda = c(1),
                                 alpha = c(0),
                                 scale_pos_weight = c(1))

grid_search_result
```

```{r}
grid_search_result %>% filter(desc_rank_auc == 1)
```

The best max_depth is still 3 and the best min_child_weight is still 5. We refine further around this combination, now with a step size of 1.

```{r}
grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(2, 3, 4),
                                 min_child_weight = c(4, 5, 6),
                                 gamma = c(0),
                                 subsample = c(0.8),
                                 colsample_bytree = c(0.8),
                                 lambda = c(1),
                                 alpha = c(0),
                                 scale_pos_weight = c(1))

grid_search_result
```

```{r}
grid_search_result %>% filter(desc_rank_auc == 1)
```

Final result: the best max_depth is 3 and the best min_child_weight is 5.
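Each vector-valued argument of grid_search is crossed with expand.grid, so the cost of a search grows multiplicatively with every parameter you vary. A quick sanity check of the grid size before launching a long run (this snippet is illustrative, not part of the original workflow):

```{r}
# Illustration: the 3 x 3 search above fits 9 cross-validated models,
# each of which runs nfold training passes.
nrow(expand.grid(max_depth = c(2, 3, 4), min_child_weight = c(4, 5, 6)))
```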
bmcNCiAgICArIEFsd2F5cyBzdGFydCB3aXRoIDAsIHVzZSB4Z2IuY3YsIGFuZCBsb29rIGhvdyB0aGUgdHJhaW4vdGVzdCBhcmUgZmFyaW5nLiBJZiB5b3UgdHJhaW4gQ1Ygc2t5cm9ja2V0aW5nIG92ZXIgdGVzdCBDViBhdCBhIGJsYXppbmcgc3BlZWQsIHRoaXMgaXMgd2hlcmUgR2FtbWEgaXMgdXNlZnVsIGluc3RlYWQgb2YgbWluX2NoaWxkX3dlaWdodCAoYmVjYXVzZSB5b3UgbmVlZCB0byBjb250cm9sIHRoZSBjb21wbGV4aXR5IGlzc3VlZCBmcm9tIHRoZSBsb3NzLCBub3QgdGhlIGxvc3MgZGVyaXZhdGl2ZSBmcm9tIHRoZSBoZXNzaWFuIHdlaWdodCBpbiBtaW5fY2hpbGRfd2VpZ2h0KS4gQW5vdGhlciBjaG9pY2UgdHlwaWNhbCBhbmQgbW9zdCBwcmVmZXJyZWQgY2hvaWNlOiBzdGVwIG1heF9kZXB0aCBkb3duLg0KICAgICsgSWYgR2FtbWEgaXMgdXNlZnVsIChpLmUgdHJhaW4gQ1Ygc2t5cm9ja2V0cyBhdCBnb2RsaWtlIHNwZWVkIHdoZW4gdGVzdCBDViBjYW6hr3QgZm9sbG93KSwgY3JhbmsgdXAgR2FtbWEuIFRoaXMgaXMgd2hlcmUgdGhlIGV4cGVyaWVuY2Ugd2l0aCB0dW5pbmcgR2FtbWEgaXMgdXNlZnVsIChzbyB5b3UgbG9zZSB0aGUgbG93ZXN0IGFtb3VudCBvZiB0aW1lKS4gRGVwZW5kaW5nIG9uIHdoYXQgeW91IHNlZSBiZXR3ZWVuIHRoZSB0cmFpbi90ZXN0IENWIGluY3JlYXNlIHNwZWVkLCB5b3UgdHJ5IHRvIGZpbmQgYW4gYXBwcm9wcmlhdGUgR2FtbWEuIFRoZSBoaWdoZXIgdGhlIEdhbW1hLCB0aGUgbG93ZXIgdGhlIGRpZmZlcmVuY2UgYmV0d2VlbiB0cmFpbi90ZXN0IENWIHdpbGwgaGFwcGVuLiBJZiB5b3UgaGF2ZSBubyBpZGVhIG9mIHRoZSB2YWx1ZSB0byB1c2UsIHB1dCAxMCBhbmQgbG9vayB3aGF0IGhhcHBlbnMuDQoNCisgSG93IHRvIHNldCBHYW1tYSB2YWx1ZXM/DQogICAgKyBJZiB5b3VyIHRyYWluL3Rlc3QgQ1YgYXJlIGFsd2F5cyBseWluZyB0b28gY2xvc2UsIGl0IG1lYW5zIHlvdSBjb250cm9sbGVkIHdheSB0b28gbXVjaCB0aGUgY29tcGxleGl0eSBvZiB4Z2Jvb3N0LCBhbmQgdGhlIG1vZGVsIGNhbqGvdCBncm93IHRyZWVzIHdpdGhvdXQgcHJ1bmluZyB0aGVtIChkdWUgdG8gdGhlIGxvc3MgdGhyZXNob2xkIG5vdCByZWFjaGVkIHRoYW5rcyB0byBHYW1tYSkuIExvd2VyIEdhbW1hIChnb29kIHJlbGF0aXZlIHZhbHVlIHRvIHJlZHVjZSBpZiB5b3UgZG9uoa90IGtub3c6IGN1dCAyMCUgb2YgR2FtbWEgYXdheSB1bnRpbCB5b3UgdGVzdCBDViBncm93cyB3aXRob3V0IGhhdmluZyB0aGUgdHJhaW4gQ1YgZnJvemVuKS4NCiAgICArIElmIHlvdXIgdHJhaW4vdGVzdCBDViBhcmUgZGlmZmVyaW5nIHRvbyBtdWNoLCBpdCBtZWFucyB5b3UgZGlkIG5vdCBjb250cm9sIGVub3VnaCB0aGUgY29tcGxleGl0eSBvZiB4Z2Jvb3N0LCBhbmQgdGhlIG1vZGVsIGdyb3dzIHRvbyBtYW55IHRyZWVzIHdpdGhvdXQgcHJ1bmluZyB0aGVtIChkdWUgdG8gdGhlIGxvc3MgdGhyZXNob2xkIG5vdCByZWFjaGVkIGJl
Y2F1c2Ugb2YgR2FtbWEpLiBQdXQgYSBoaWdoZXIgR2FtbWEgKGdvb2QgYWJzb2x1dGUgdmFsdWUgdG8gdXNlIGlmIHlvdSBkb26hr3Qga25vdzogKzIsIHVudGlsIHlvdXIgdGVzdCBDViBjYW4gZm9sbG93IGZhc3RlciB5b3VyIHRyYWluIENWIHdoaWNoIGdvZXMgc2xvd2VyLCB5b3VyIHRlc3QgQ1Ygc2hvdWxkIGJlIGFibGUgdG8gcGVhaykuDQogICAgKyBJZiB5b3VyIHRyYWluIENWIGlzIHN0dWNrIChub3QgaW5jcmVhc2luZywgb3IgaW5jcmVhc2luZyB3YXkgdG9vIHNsb3dseSksIGRlY3JlYXNlIEdhbW1hOiB0aGF0IHZhbHVlIHdhcyB0b28gaGlnaCBhbmQgeGdib29zdCBrZWVwcyBwcnVuaW5nIHRyZWVzIHVudGlsIGl0IGNhbiBmaW5kIHNvbWV0aGluZyBhcHByb3ByaWF0ZSAob3IgaXQgbWF5IGVuZCBpbiBhbiBlbmRsZXNzIGxvb3Agb2YgdGVzdGluZyArIGFkZGluZyBub2RlcyBidXQgcHJ1bmluZyB0aGVtIHN0cmFpZ2h0IGF3YXmhrSkuDQoNCmBgYHtyfQ0KZ3JpZF9zZWFyY2hfcmVzdWx0ID0gZ3JpZF9zZWFyY2goZHRyYWluLCB0cmFpbl9sYWJlbCwNCiAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgIGV0YSA9IGMoMC4xKSwNCiAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgIG1heF9kZXB0aCA9IGMoMyksDQogICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICBtaW5fY2hpbGRfd2VpZ2h0ID0gYyg1KSwNCiAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgIGdhbW1hID0gYygwLCAwLjAxLCAwLjEsIDEsIDMsIDUsIDEwLCAyMCksDQogICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICBzdWJzYW1wbGUgPSBjKDAuOCksDQogICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICBjb2xzYW1wbGVfYnl0cmVlID0gYygwLjgpLA0KICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgbGFtYmRhID0gYygxKSwNCiAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgIGFscGhhID0gYygwKSwNCiAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgIHNjYWxlX3Bvc193ZWlnaHQgPSBjKDEpKQ0KDQpncmlkX3NlYXJjaF9yZXN1bHQNCmBgYA0KDQpgYGB7cn0NCmdyaWRfc2VhcmNoX3Jlc3VsdCAlPiUgZmlsdGVyKGRlc2NfcmFua19hdWMgPT0gMSkNCmBgYA0KDQrA7c/rtcRnYW1tYda1zqozo6zU2tXi0ruyzsr91rW4vb38vfjSu7K9tffV+6OsvauyvbOkyejWw86qMaGjDQoNCmBgYHtyfQ0KZ3JpZF9zZWFyY2hfcmVzdWx0ID0gZ3JpZF9zZWFyY2goZHRyYWluLCB0cmFpbl9sYWJlbCwNCiAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgIGV0YSA9IGMoMC4xKSwNCiAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgIG1heF9kZXB0aCA9IGMoMyksDQogICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICBtaW5fY2hpbGRfd2VpZ2h0ID0gYyg1KSwNCiAgICAgICAgICAgICAgICAgICAgICAgICAgICAgICAgIGdhbW1hID0gYygyLCAzLCA0KSwNCiAgICAgICAg
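The guidance above boils down to watching the gap between the train and test CV curves. A minimal sketch of that check, assuming an xgb.cv result object like the ones grid_search fits internally (the name `xgb_cv_fit` is hypothetical, not part of the original code):

```{r}
# Hypothetical illustration: quantify the final train-vs-test AUC gap
# from an xgb.cv run to decide whether gamma should be raised.
log <- xgb_cv_fit$evaluation_log
gap <- tail(log$train_auc_mean, 1) - tail(log$test_auc_mean, 1)
gap  # a large positive gap suggests overfitting: raise gamma or step max_depth down
```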
### 4.1.3 Tuning subsample and colsample_bytree
Grid-search subsample and colsample_bytree over the range 0.6 to 1.0 with a step size of 0.1.

```{r}
grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(3),
                                 min_child_weight = c(5),
                                 gamma = c(3),
                                 subsample = c(0.6, 0.7, 0.8, 0.9, 1.0),
                                 colsample_bytree = c(0.6, 0.7, 0.8, 0.9, 1.0),
                                 lambda = c(1),
                                 alpha = c(0),
                                 scale_pos_weight = c(1))

grid_search_result
```

```{r}
grid_search_result %>% filter(desc_rank_auc == 1)
```

Final result: the best subsample is 0.7 and the best colsample_bytree is 0.8.
## 4.2 Tuning the regularization parameters alpha and lambda
Since gamma already provides a more effective way to reduce overfitting, alpha and lambda can be tuned relatively coarsely.

```{r}
grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(3),
                                 min_child_weight = c(5),
                                 gamma = c(3),
                                 subsample = c(0.7),
                                 colsample_bytree = c(0.8),
                                 lambda = c(0, 1e-5, 1e-2, 0.1, 1, 100),
                                 alpha = c(0, 1e-5, 1e-2, 0.1, 1, 100),
                                 scale_pos_weight = c(1))

grid_search_result
```

```{r}
grid_search_result %>% filter(desc_rank_auc == 1)
```

The best lambda is 100 and the best alpha is 1.
## 4.3 Grid search over all parameters

```{r}
weight = (length(train_label) - sum(train_label)) / sum(train_label)

grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.1),
                                 max_depth = c(2, 3, 4),
                                 min_child_weight = c(4, 5, 6),
                                 gamma = c(2, 3, 4),
                                 subsample = c(0.6, 0.7, 0.8),
                                 colsample_bytree = c(0.7, 0.8, 0.9),
                                 lambda = c(100),
                                 alpha = c(1),
                                 scale_pos_weight = c(1, weight))
```

```{r}
grid_search_result
```

```{r}
grid_search_result %>% filter(desc_rank_auc == 1)
```

Final result: max_depth is 4, min_child_weight is 5, gamma is 3, subsample is 0.6, colsample_bytree is 0.9, lambda is 100, alpha is 1, and scale_pos_weight is 1.
## 4.4 Lowering the learning rate
Lower the learning rate by factors of 10, searching eta over 0.001, 0.01 and 0.1.

```{r}
grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.001, 0.01, 0.1),
                                 max_depth = c(4),
                                 min_child_weight = c(5),
                                 gamma = c(3),
                                 subsample = c(0.6),
                                 colsample_bytree = c(0.9),
                                 lambda = c(100),
                                 alpha = c(1),
                                 scale_pos_weight = c(1))

grid_search_result
```

```{r}
grid_search_result %>% filter(desc_rank_auc == 1)
```

The best eta is 0.1; we search around 0.1 with a step size of 0.05.

```{r}
grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.05, 0.1, 0.15, 0.2, 0.25, 0.3),
                                 max_depth = c(4),
                                 min_child_weight = c(5),
                                 gamma = c(3),
                                 subsample = c(0.6),
                                 colsample_bytree = c(0.9),
                                 lambda = c(100),
                                 alpha = c(1),
                                 scale_pos_weight = c(1))

grid_search_result
```

```{r}
grid_search_result %>% filter(desc_rank_auc == 1)
```

The best eta is still 0.1; we search around 0.1 with a step size of 0.01.

```{r}
grid_search_result = grid_search(dtrain, train_label,
                                 eta = c(0.06, 0.07, 0.08, 0.09, 0.1, 0.11, 0.12, 0.13, 0.14),
                                 max_depth = c(4),
                                 min_child_weight = c(5),
                                 gamma = c(3),
                                 subsample = c(0.6),
                                 colsample_bytree = c(0.9),
                                 lambda = c(100),
                                 alpha = c(1),
                                 scale_pos_weight = c(1))

grid_search_result
```

```{r}
grid_search_result %>% filter(desc_rank_auc == 1)
```

Final result: the best eta is 0.12. Next we find the nrounds corresponding to this eta.
```{r}
bst_params = list(objective = "binary:logistic",
                  eval_metric = 'auc',
                  eta = c(0.12),
                  max_depth = c(4),
                  min_child_weight = c(5),
                  gamma = c(3),
                  subsample = c(0.6),
                  colsample_bytree = c(0.9),
                  lambda = c(100),
                  alpha = c(1),
                  scale_pos_weight = c(1))

set.seed(10)
bst_cv = xgb.cv(data = dtrain,
                params = bst_params,
                # nthread = 20,
                missing = NA,
                nrounds = 10000,
                early_stopping_rounds = 50,
                nfold = 5,
                stratified = T,
                verbose = F,
                prediction = T
)
```

```{r}
calc_auc_and_ks(bst_cv$pred, train_label)
```

```{r}
bst_cv$niter
```

```{r}
bst_cv$evaluation_log
```

Final result: the best nrounds is 259.

# 5 Model evaluation
## 5.1 Confusion matrix
```{r}
library(caret)
library(e1071)

xgb_pred <- ifelse(bst_cv$pred > 0.5, 1, 0)

# confusionMatrix() expects factors, so convert the 0/1 vectors first
confusionMatrix(factor(xgb_pred), factor(train_label))
```

## 5.2 ROC curve
```{r}
library(pROC)
modelroc = roc(train_label, bst_cv$pred, thresholds = 0.5)
plot(modelroc, print.auc = T, auc.polygon = T, grid = c(0.1, 0.2), grid.col = c('green', 'red'), max.auc.polygon = T, auc.polygon.col = 'skyblue', print.thres = T)
```

## 5.3 Train and cross-validation AUC versus training round
```{r}
library(tidyr)
library(ggplot2)  # needed for ggplot()
bst_cv$evaluation_log %>%
  select(-contains("std")) %>%
  gather(TestOrTrain, AUC, -iter) %>%
  ggplot(aes(x = iter, y = AUC, group = TestOrTrain, color = TestOrTrain)) + 
  geom_line() + 
  theme_bw()
```
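Besides the ROC curve, the KS statistic computed by calc_auc_and_ks in section 4.0.1 can be visualized directly: KS is the maximum vertical gap between the TPR and FPR curves. A sketch reusing the same ROCR objects (this plot is an addition, not part of the original tutorial):

```{r}
# Sketch: show where KS = max(TPR - FPR) is attained, mirroring the
# computation in calc_auc_and_ks(). Assumes bst_cv$pred and train_label exist.
library(ROCR)
pred.obj <- ROCR::prediction(bst_cv$pred, train_label)
perf <- performance(pred.obj, "tpr", "fpr")
tpr <- attr(perf, "y.values")[[1]]
fpr <- attr(perf, "x.values")[[1]]
plot(tpr - fpr, type = "l",
     xlab = "threshold index", ylab = "TPR - FPR",
     main = "KS = max(TPR - FPR)")
abline(v = which.max(tpr - fpr), lty = 2)  # where the maximum gap occurs
```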
# 6 Model training
With the tuned parameters, train a model on the full training set via xgb.train.
```{r}
bst_params = list(objective = "binary:logistic",
                  eval_metric = 'auc',
                  eta = c(0.12),
                  max_depth = c(4),
                  min_child_weight = c(5),
                  gamma = c(3),
                  subsample = c(0.6),
                  colsample_bytree = c(0.9),
                  lambda = c(100),
                  alpha = c(1),
                  scale_pos_weight = c(1))

bst = xgb.train(data = dtrain,
                params = bst_params,
                # nthread = 20,
                missing = NA,
                nrounds = 259,
                verbose = F
                )
```

# 7 Model interpretability
## 7.1 Feature importance scores
```{r}
importance <- xgb.importance(feature_names = colnames(train_data), model = bst)
importance
```

+ Gain is the improvement in accuracy brought by a feature to the branches it is on. The idea is that before adding a new split on a feature X to the branch there were some wrongly classified elements; after adding the split on this feature, there are two new branches, and each of these branches is more accurate (one branch saying that if your observation is on this branch then it should be classified as 1, and the other branch saying the exact opposite).

+ Cover measures the relative quantity of observations concerned by a feature.

+ Frequency is a simpler way to measure the Gain. It just counts the number of times a feature is used in all generated trees. You should not use it (unless you know why you want to use it).

## 7.2 Plotting the feature importance scores
```{r}
xgb.plot.importance(importance_matrix = importance) 
```

## 7.3 Interpreting the effect of each variable
```{r}
importanceRaw <- xgb.importance(feature_names = colnames(train_data), model = bst, data = train_data, label = train_label)
```

```{r}
importanceClean <- importanceRaw[,`:=`(Cover=NULL, Frequency=NULL)]
```

```{r}
importanceClean
```
## 7.4 Plotting the model
```{r}
bst_params = list(objective = "binary:logistic",
                  eval_metric = 'auc',
                  eta = c(0.12),
                  max_depth = c(4),
                  min_child_weight = c(5),
                  gamma = c(3),
                  subsample = c(0.6),
                  colsample_bytree = c(0.9),
                  lambda = c(100),
                  alpha = c(1),
                  scale_pos_weight = c(1))
bst_graph = xgb.train(data = dtrain,
                      params = bst_params,
                      # nthread = 20,
                      missing = NA,
                      nrounds = 2,
                      verbose = F
                      )
```

```{r}
library(DiagrammeR)
xgb.plot.tree(model = bst_graph)
```

# 8 Random forest
Random forests and gradient-boosted decision trees are both ensemble methods. Both train many decision trees on the same data set; the difference is that the trees in a random forest are built independently of one another, whereas in gradient boosting each tree corrects the trees built before it.

## 8.1 Model training
A random forest can also be fitted with XGBoost; below we build one consisting of 1000 trees.
```{r}
rf <- xgb.train(data = dtrain, max_depth = 4, num_parallel_tree = 1000, subsample = 0.8, colsample_bytree = 0.8, nrounds = 1, objective = "binary:logistic", eval_metric = 'auc')
```
YXVjLnBvbHlnb249VCxhdWMucG9seWdvbi5jb2w9J3NreWJsdWUnLHByaW50LnRocmVzPVQpDQpgYGANCg0KYGBge3J9DQpyZl9wcmVkIDwtIGlmX2Vsc2UocmZfcHJlZCA+IDAuNSwgMSwgMCkNCmNvbmZ1c2lvbk1hdHJpeChyZl9wcmVkLCB0cmFpbl9sYWJlbCkNCmBgYA0KDQojIyA4LjMgzNjV99bY0qrQ1MbAt9YNCmBgYHtyfQ0KaW1wb3J0YW5jZSA8LSB4Z2IuaW1wb3J0YW5jZShmZWF0dXJlX25hbWVzID0gY29sbmFtZXModHJhaW5fZGF0YSksIG1vZGVsID0gcmYpDQppbXBvcnRhbmNlDQpgYGANCg0KYGBge3J9DQp4Z2IucGxvdC5pbXBvcnRhbmNlKGltcG9ydGFuY2VfbWF0cml4ID0gaW1wb3J0YW5jZSkgDQpgYGANCg==